# Korean Pre-training
| Model | Author | License | Tags | Description |
|-------|--------|---------|------|-------------|
| KoBERT-LM | monologg | Apache-2.0 | Large Language Model, Korean | A pre-trained language model based on the BERT architecture and further pre-trained on Korean text. |
| RoBERTa-Ko-Small | lassl | Apache-2.0 | Large Language Model, Transformers, Korean | A compact Korean RoBERTa model trained under the LASSL framework, suitable for a range of Korean natural language processing tasks. |